
Combining Classifiers Using Correspondence Analysis

Neural Information Processing Systems

Several effective methods for improving the performance of a single learning algorithm have been developed recently. The general approach is to create a set of learned models by repeatedly applying the algorithm to different versions of the training data, and then combine the learned models' predictions according to a prescribed voting scheme. Little work has been done in combining the predictions of a collection of models generated by many learning algorithms having different representation and/or search strategies. This paper describes a method which uses the strategies of stacking and correspondence analysis to model the relationship between the learning examples and the way in which they are classified by a collection of learned models. A nearest neighbor method is then applied within the resulting representation to classify previously unseen examples.
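The pipeline the abstract describes can be sketched as follows: stack each model's predictions into an indicator matrix, run correspondence analysis (a double-centered SVD) on it, and use the resulting row coordinates as the space for nearest-neighbor classification. This is a minimal NumPy sketch under assumed toy data (the example counts, models, and class labels are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical setup: 6 training examples, predictions from 3 learned
# models, 2 classes (0/1). Row i records how each model classified
# example i.
model_preds = np.array([
    [0, 0, 1],
    [0, 1, 1],
    [1, 1, 1],
    [0, 0, 0],
    [1, 0, 1],
    [1, 1, 0],
])
n_classes = 2

# One-hot encode each model's predictions and concatenate them into an
# indicator matrix, the standard input to correspondence analysis.
X = np.concatenate(
    [np.eye(n_classes)[model_preds[:, m]] for m in range(model_preds.shape[1])],
    axis=1,
)

# Correspondence analysis: normalize to a correspondence matrix, remove
# the independence model, and take the SVD of the standardized residuals.
P = X / X.sum()
r = P.sum(axis=1)                         # row masses
c = P.sum(axis=0)                         # column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, s, Vt = np.linalg.svd(S, full_matrices=False)

# Principal row coordinates: one point per training example in the
# derived space; a new example's prediction profile would be projected
# here and classified by nearest neighbor, as the abstract describes.
row_coords = (U * s) / np.sqrt(r)[:, None]
```

The point of the representation is that examples which the models classify in similar ways land near each other, so a nearest-neighbor lookup in `row_coords` implicitly decides which models' agreement patterns to trust.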


On the Product Rule for Classification Problems

Cicconet, Marcelo

arXiv.org Machine Learning

We discuss theoretical aspects of the product rule for classification problems in supervised machine learning for the case of combining classifiers. We show that (1) the product rule arises from the MAP classifier under the assumptions of equal priors and conditional independence given the class; (2) under some conditions, the product rule is equivalent to minimizing the sum of the squared distances to the respective centers of the classes associated with different features, with the distances weighted by the spread of the classes; (3) under certain hypotheses, the product rule is equivalent to concatenating the feature vectors. With the advance of the Machine Learning field, and the discovery of many different techniques, the subject of combining multiple learners [2] eventually drew attention, in particular the problem of combining classifiers. Many different methods appeared, and soon they were compared in terms of their efficiency in solving problems. The product rule has been present in some of these works (e.g., [1, 7, 3, 6, 5, 4, 8]), in contexts ranging from the accuracy of the different combination rules to the analytical properties of the different methods.
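Point (1) of the abstract can be illustrated directly: under equal priors and class-conditional independence, the MAP decision reduces to multiplying each classifier's posterior estimates per class and picking the largest product. A minimal sketch with made-up posteriors (the numbers are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical per-classifier posterior estimates P(class | features_i)
# for a single test instance: two classifiers, three classes.
posteriors = np.array([
    [0.6, 0.3, 0.1],   # classifier 1
    [0.5, 0.2, 0.3],   # classifier 2
])

# Product rule: multiply posteriors across classifiers per class.
# Under equal priors and conditional independence given the class,
# argmax of this product is the MAP decision.
combined = posteriors.prod(axis=0)        # [0.30, 0.06, 0.03]
prediction = int(np.argmax(combined))     # class 0
```

In practice the products are usually computed as sums of log-posteriors to avoid underflow when many classifiers are combined; the argmax is unchanged.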


A Privacy-Aware Bayesian Approach for Combining Classifier and Cluster Ensembles

Acharya, Ayan, Hruschka, Eduardo R., Ghosh, Joydeep

arXiv.org Machine Learning

This paper introduces a privacy-aware Bayesian approach that combines ensembles of classifiers and clusterers to perform semi-supervised and transductive learning. We consider scenarios where instances and their classification/clustering results are distributed across different data sites and have sharing restrictions. As a special case, the privacy-aware computation of the model when instances of the target data are distributed across different data sites is also discussed. Experimental results show that the proposed approach can provide good classification accuracies while adhering to the data/model sharing constraints.


Combining Classifiers Using Correspondence Analysis

Merz, Christopher J.

Neural Information Processing Systems

The challenge of this problem is to decide which models to rely on for prediction and how much weight to give each. The goal of combining learned models is to obtain a more accurate prediction than can be obtained from any single source alone.
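The two sub-problems named here, choosing which models to rely on and how much weight to give each, are both answered at once by a weighted vote. A minimal sketch with illustrative weights and predictions (not the paper's method, which uses correspondence analysis instead of fixed weights):

```python
import numpy as np

# Hypothetical per-model reliability weights (e.g. from validation
# accuracy) and each model's predicted class for one test instance.
weights = np.array([0.5, 0.3, 0.2])
preds = np.array([1, 0, 1])
n_classes = 2

# Accumulate each model's weight into the bin of its predicted class,
# then take the heaviest bin as the combined prediction.
votes = np.zeros(n_classes)
for w, p in zip(weights, preds):
    votes[p] += w
prediction = int(np.argmax(votes))        # class 1 (0.7 vs 0.3)
```

A model with weight near zero is effectively dropped, so "which models" and "how much weight" collapse into one set of parameters.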

